6 - Applications of ODENets to time series

Applications of ODE nets to time series.

Here we will talk about an example provided in the original paper and its algorithm for handling time series.

So let's say we

have observations depicting the stock market or any other time series data at time steps

We then sample from that Gaussian a ZT0 that will serve as the initial value for your ODE solve

to then propagate for all the time steps from T1 to Tn.

Now that we have z_t0 as the initial value, we perform an ODE solve to get the latent-space representations of all the other time steps, that is, z_t1 through z_tN.

We then decode them from the latent space to the data space to get our predictions for t_1 through t_N, and then it is straightforward to calculate the reconstruction loss between the observed and predicted values.
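The pipeline just described (encode, sample z_t0, ODE solve, decode, compare) can be sketched end to end. This is a minimal toy sketch in NumPy: the dynamics function, the decoder weights, and the "encoder output" are all random stand-ins for learned components, and a fixed-step Euler loop stands in for a proper ODE solver.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(z, t, W):
    # Assumed toy dynamics dz/dt = tanh(W z); in practice this is a
    # learned neural network.
    return np.tanh(W @ z)

def odesolve_euler(f, z0, ts, W, steps=20):
    # Fixed-step Euler integration, recording the latent at each time stamp.
    zs, z, t = [z0], z0.copy(), ts[0]
    for t_next in ts[1:]:
        h = (t_next - t) / steps
        for _ in range(steps):
            z = z + h * f(z, t, W)
            t += h
        zs.append(z.copy())
    return np.stack(zs)

# Hypothetical shapes: latent dim 4, data dim 2, 5 (irregular) time stamps.
W = rng.normal(scale=0.5, size=(4, 4))      # dynamics weights (stand-in)
D = rng.normal(scale=0.5, size=(2, 4))      # linear decoder (stand-in)
ts = np.array([0.0, 0.3, 0.7, 1.2, 2.0])

# Encoder output (stand-in): mean and std of the Gaussian over z_t0.
mu, sigma = rng.normal(size=4), 0.1 * np.ones(4)
z_t0 = mu + sigma * rng.normal(size=4)       # sample the initial latent

z_path = odesolve_euler(f, z_t0, ts, W)      # latents z_t0 .. z_tN
x_pred = z_path @ D.T                        # decode to data space
x_obs = rng.normal(size=x_pred.shape)        # placeholder observations
recon_loss = np.mean((x_pred - x_obs) ** 2)  # reconstruction term
```

In a real model, the reconstruction term is combined with a KL term on the Gaussian over z_t0 and minimized by backpropagating through the solve.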

What is happening here is that we are learning the patterns in your time series and approximating them with the derivative function inside your ODE solve.

The advantage of this is that, once trained, we can run the same ODE solve plus decode to extrapolate to a time step in the future.

This method has, in theory, no limitations when it comes to irregularly sampled data or extrapolation over a different delta t, which is exactly what we want from a time-aware deep learning model, because it is learning how the phenomenon changes with respect to time as a continuous function, as opposed to a discrete formulation.
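Because the dynamics are a continuous function of time, the model can be queried at any time stamp, regular or not, inside or beyond the observed range. A minimal sketch, assuming a toy 2-D rotation as the "trained" dynamics and a hand-rolled Euler integrator:

```python
import numpy as np

def f(z, t):
    # Toy "trained" dynamics: a rotation in a 2-D latent space
    # (dz1/dt = z2, dz2/dt = -z1), standing in for a learned network.
    return np.array([z[1], -z[0]])

def euler(f, z0, t0, t1, steps=100):
    # Fixed-step Euler integration from t0 to t1.
    z, t = z0.copy(), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z, t = z + h * f(z, t), t + h
    return z

z_t0 = np.array([1.0, 0.0])
# Query times can be irregular and can lie beyond anything seen in training:
z_at = {t_q: euler(f, z_t0, 0.0, t_q) for t_q in [0.5, 1.7, 3.14, 10.0]}
```

For this rotation the exact solution is z(t) = (cos t, -sin t), so the Euler result can be checked against it; a discrete-step model with a fixed delta t could not be queried at arbitrary times like this.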

A follow-up paper from the same group proposed a latent ODE model specifically designed to handle irregularly sampled time series.

They propose obtaining latent representations of the time series with an ODE-RNN encoder, which handles the irregular sampling. The encoder runs backwards in time, starting from t_N: at every observed time step a GRU updates the hidden state, and wherever you don't have any observation an ODE evolves the hidden state across the gap, eventually reaching t_0. That final state is encoded into the initial latent value, which then goes through your ODE solve and is finally decoded at the end.

The remaining

structure as you can see more or less remains the same.
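The backwards ODE-RNN encoder pass can be sketched as the following loop. Everything here, the hidden-state dynamics, the simplified GRU-style gate, and the weight matrices, is a hypothetical stand-in for the learned components:

```python
import numpy as np

rng = np.random.default_rng(1)
H = 4  # hidden size (assumed)

A  = rng.normal(scale=0.3, size=(H, H))      # hidden dynamics weights
Wu = rng.normal(scale=0.3, size=(H, H + 2))  # update-gate weights
Wc = rng.normal(scale=0.3, size=(H, H + 2))  # candidate weights

def ode_evolve(h_state, dt, steps=10):
    # Assumed toy hidden dynamics dh/dt = tanh(A h), Euler-integrated
    # over the gap between two observations; learned in practice.
    step = dt / steps
    for _ in range(steps):
        h_state = h_state + step * np.tanh(A @ h_state)
    return h_state

def gru_update(h_state, x):
    # Simplified GRU-style cell: gate between the old hidden state
    # and a candidate computed from the new observation.
    v = np.concatenate([h_state, x])
    u = 1.0 / (1.0 + np.exp(-(Wu @ v)))  # update gate
    cand = np.tanh(Wc @ v)               # candidate state
    return (1 - u) * h_state + u * cand

# Irregularly sampled observations, processed backwards from t_N to t_0.
ts = np.array([0.0, 0.4, 1.1, 1.3, 2.6])
xs = rng.normal(size=(len(ts), 2))

h_state = np.zeros(H)
for i in range(len(ts) - 1, -1, -1):
    h_state = gru_update(h_state, xs[i])                   # GRU at an observation
    if i > 0:
        h_state = ode_evolve(h_state, ts[i] - ts[i - 1])   # ODE across the gap
# h_state now summarizes the whole series at t_0 and would parameterize
# the Gaussian over the initial latent z_t0.
```

The key difference from a plain GRU encoder is that the hidden state keeps evolving continuously between observations instead of being held fixed.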

That brings us to the end of this lecture.

We learned about time aware networks.

We learned about neural ODEs where an ODE solver is equated

to a forward propagation through a neural network.

We learned about the adjoint method that is used

to solve this back propagation problem.

We also learned how these models are being used for irregularly sampled time series, as well as for inference at irregular time points with such deep learning models.

I hope this was a useful lecture in learning more about time aware neural networks.

I would highly recommend going through all the references that I've mentioned, and also some tutorials, to get more hands-on experience with such models.

Thank you and have a great day.


Part of chapter: Time-aware models

Access: open access

Duration: 00:04:28 min

Recorded: 2025-11-04

Uploaded: 2025-11-04 16:05:12

Language: en-US
